On the Global Convergence of Broyden's Method

Authors

  • J. A. Trangenstein
Abstract

We consider Broyden's 1965 method for solving nonlinear equations. If the mapping is linear, then a simple modification of this method guarantees global and Q-superlinear convergence. For nonlinear mappings it is shown that the hybrid strategy for nonlinear equations due to Powell leads to R-superlinear convergence provided the search directions form a uniformly linearly independent sequence. We then explore this last concept and its connection with Broyden's method. Finally, we point out how the above results extend to Powell's symmetric version of Broyden's method.
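As a rough illustration of the method discussed above, the following minimal Python sketch implements the plain Broyden (1965) rank-one secant iteration for F(x) = 0. The globalization safeguards analyzed in the paper (such as Powell's hybrid strategy) are omitted, and the test function, starting point, and tolerances are illustrative assumptions, not taken from the article.

import numpy as np

def broyden(F, x0, tol=1e-10, max_iter=50):
    # Plain Broyden (1965) iteration for F(x) = 0, starting from the
    # identity as the initial Jacobian approximation B0.
    x = np.asarray(x0, dtype=float)
    Fx = F(x)
    B = np.eye(x.size)
    for _ in range(max_iter):
        if np.linalg.norm(Fx) <= tol:
            break
        s = np.linalg.solve(B, -Fx)          # quasi-Newton step: B s = -F(x)
        x_new = x + s
        Fx_new = F(x_new)
        y = Fx_new - Fx
        # Broyden's rank-one secant update: B+ = B + (y - B s) s^T / (s^T s)
        B += np.outer(y - B @ s, s) / (s @ s)
        x, Fx = x_new, Fx_new
    return x

# Illustrative use: intersection of the unit circle with the line x0 = x1.
F = lambda x: np.array([x[0]**2 + x[1]**2 - 1.0, x[0] - x[1]])
print(broyden(F, [0.8, 0.6]))   # from this nearby start it should approach (0.7071..., 0.7071...)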


Related articles

Improved Damped Quasi-Newton Methods for Unconstrained Optimization

Recently, Al-Baali (2014) has extended the damped-technique in the modified BFGS method of Powell (1978) for Lagrange constrained optimization functions to the Broyden family of quasi-Newton methods for unconstrained optimization. Appropriate choices for the damped-parameter, which maintain the global and superlinear convergence property of these methods on convex functions and correct the Hess...


On the superlinear convergence of the variable metric proximal point algorithm using Broyden and BFGS matrix secant updating

In previous work, the authors provided a foundation for the theory of variable metric proximal point algorithms in Hilbert space. In that work conditions are developed for global, linear, and superlinear convergence. This paper focuses attention on two matrix secant updating strategies for the finite dimensional case. These are the Broyden and BFGS updates. The BFGS update is considered for ap...


On the Behavior of Damped Quasi-Newton Methods for Unconstrained Optimization

We consider a family of damped quasi-Newton methods for solving unconstrained optimization problems. This family resembles that of Broyden with line searches, except that the change in gradients is replaced by a certain hybrid vector before updating the current Hessian approximation. This damped technique modifies the Hessian approximations so that they are maintained sufficiently positive defi...
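To make the damping idea described above concrete, here is a minimal Python sketch of the classical Powell (1978) damped BFGS update, in which the gradient change y is replaced by a hybrid vector before the Hessian approximation is updated. The threshold value 0.2 is the common textbook choice; this sketch does not reproduce the specific damped parameters studied in the paper above.

import numpy as np

def damped_bfgs_update(B, s, y, sigma=0.2):
    # Replace y by the hybrid vector r = theta*y + (1 - theta)*B s, with theta
    # chosen so that s^T r >= sigma * s^T B s; this keeps B positive definite.
    Bs = B @ s
    sBs = s @ Bs
    sy = s @ y
    theta = 1.0 if sy >= sigma * sBs else (1.0 - sigma) * sBs / (sBs - sy)
    r = theta * y + (1.0 - theta) * Bs
    # Standard BFGS update with r in place of y.
    return B - np.outer(Bs, Bs) / sBs + np.outer(r, r) / (s @ r)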


Convergence analysis of a modified BFGS method on convex minimizations

A modified BFGS method is proposed for unconstrained optimization. Global convergence and superlinear convergence on convex functions are established under suitable assumptions. Numerical results show that this method is promising.


Global convergence of online limited memory BFGS

Global convergence of an online (stochastic) limited memory version of the Broyden-Fletcher-Goldfarb-Shanno (BFGS) quasi-Newton method for solving optimization problems with stochastic objectives that arise in large scale machine learning is established. Lower and upper bounds on the Hessian eigenvalues of the sample functions are shown to suffice to guarantee that the curvature approximation ma...



Journal:

Volume   Issue 

Pages  -

Publication date: 2010